Self-Supervised Continual Graph Learning in Adaptive Riemannian Spaces
Authors
Abstract
Continual graph learning routinely finds its role in a variety of real-world applications where graph data with different tasks come sequentially. Despite the success of prior works, it still faces great challenges. On the one hand, existing methods work in zero-curvature Euclidean space and largely ignore the fact that curvature varies over the coming graph sequence. On the other hand, continual learners in the literature rely on abundant labels, but labeling graphs in practice is particularly hard, especially for continuously emerging graphs on-the-fly. To address the aforementioned challenges, we propose to explore a challenging yet practical problem: self-supervised continual graph learning in adaptive Riemannian spaces. In this paper, we propose a novel self-supervised Riemannian Graph Continual Learner (RieGrace). In RieGrace, we first design an Adaptive Riemannian GCN (AdaRGCN), a unified GCN coupled with a neural curvature adapter, so that the Riemannian space is shaped by the curvature learnt for each graph. Then, we present a Label-free Lorentz Distillation approach, in which we create a teacher-student AdaRGCN pair for the graph sequence. The student successively performs intra-distillation from itself and inter-distillation from the teacher so as to consolidate knowledge without catastrophic forgetting. In particular, we propose a theoretically grounded Generalized Projection for contrastive distillation in Riemannian space. Extensive experiments on benchmark datasets show the superiority of RieGrace, and, additionally, we investigate how the curvature changes over the graph sequence.
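The abstract above only names the Riemannian machinery at a high level. As a minimal, hypothetical illustration (not the authors' implementation), the sketch below shows the kind of curvature-parameterised operation such a model builds on: the exponential map that lifts a Euclidean (tangent) vector onto a Lorentz-model hyperbolic space whose curvature is controlled by a scalar `K`. The function names and the parameter `K` are assumptions for this sketch; in AdaRGCN the curvature is produced by a neural adapter rather than fixed by hand.

```python
import numpy as np

def lorentz_inner(x, y):
    """Lorentzian inner product <x, y>_L = -x0*y0 + <x[1:], y[1:]>."""
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def exp_map_origin(v_tail, K=1.0):
    """Exponential map at the origin of the Lorentz model with
    curvature -1/K (K > 0). `v_tail` is a d-dimensional tangent vector;
    returns a (d+1)-vector y on the manifold, i.e. <y, y>_L = -K."""
    d = v_tail.shape[0]
    origin = np.zeros(d + 1)
    origin[0] = np.sqrt(K)                    # the manifold's "north pole"
    norm = np.linalg.norm(v_tail)
    if norm < 1e-12:
        return origin
    v = np.concatenate(([0.0], v_tail))       # tangent vectors at the origin have v0 = 0
    return (np.cosh(norm / np.sqrt(K)) * origin
            + np.sqrt(K) * np.sinh(norm / np.sqrt(K)) * v / norm)

# Lift a Euclidean embedding onto a space of curvature -1/2.
y = exp_map_origin(np.array([0.3, -0.7, 1.2]), K=2.0)
```

Making `K` (and hence the curvature) a learnable, per-graph quantity is what lets the representation space bend to match each incoming graph in the sequence, instead of fixing a single zero-curvature Euclidean space for all tasks.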
Similar resources
Supervised Learning of Graph Structure
Graph-based representations have been used with considerable success in computer vision in the abstraction and recognition of object shape and scene structure. Despite this, the methodology available for learning structural representations from sets of training examples is relatively limited. In this paper we take a simple yet effective Bayesian approach to attributed graph learning. We present...
Graph-Based Semi-Supervised Learning
While labeled data is expensive to prepare, ever increasing amounts of unlabeled data is becoming widely available. In order to adapt to this phenomenon, several semi-supervised learning (SSL) algorithms, which learn from labeled as well as unlabeled data, have been developed. In a separate line of work, researchers have started to realize that graphs provide a natural way to represent data in ...
Graph attribute embedding via Riemannian submersion learning
In this paper, we tackle the problem of embedding a set of relational structures into a metric space for purposes of matching and categorisation. To this end, we view the problem from a Riemannian perspective and make use of the concepts of charts on the manifold to define the embedding as a mixture of class-specific submersions. Formulated in this manner, the mixture weights are recovered usin...
Adaptive Sparseness for Supervised Learning
The goal of supervised learning is to infer a functional mapping based on a set of training examples. To achieve good generalization, it is necessary to control the “complexity” of the learned function. In Bayesian approaches, this is done by adopting a prior for the parameters of the function being learned. We propose a Bayesian approach to supervised learning, which leads to sparse solutions;...
Neural Learning in Structured Parameter Spaces - Natural Riemannian Gradient
The parameter space of neural networks has a Riemannian metric structure. The natural Riemannian gradient should be used instead of the conventional gradient, since the former denotes the true steepest descent direction of a loss function in the Riemannian space. The behavior of the stochastic gradient learning algorithm is much more effective if the natural gradient is used. The present paper ...
Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence
Year: 2023
ISSN: 2159-5399, 2374-3468
DOI: https://doi.org/10.1609/aaai.v37i4.25586